Task-group Relatedness and Generalization Bounds for Regularized Multi-task Learning
Authors
Abstract
In this paper, we study the generalization performance of regularized multi-task learning (RMTL) in a vector-valued framework, where multi-task learning (MTL) is treated as a learning process for vector-valued functions. We are mainly concerned with two theoretical questions: 1) under what conditions does RMTL, with a smaller sample size per task, perform better than single-task learning (STL)? 2) under what conditions is RMTL generalizable and able to guarantee the consistency of each task during simultaneous learning? In particular, we investigate two types of task-group relatedness: the observed discrepancy-dependence measure (ODDM) and the empirical discrepancy-dependence measure (EDDM), both of which detect the dependence between two groups of multiple related tasks (MRTs). We then introduce the Cartesian product-based uniform entropy number (CPUEN) to measure the complexity of vector-valued function classes. By applying specific deviation and symmetrization inequalities to the vector-valued framework, we obtain a generalization bound for RMTL, which upper-bounds the joint probability of the event that at least one task has a large discrepancy between its expected and empirical risks. Finally, we present a sufficient condition that guarantees the consistency of each task in the simultaneous learning process, and we discuss how task relatedness affects the generalization performance of RMTL. Our theoretical findings answer the two questions above.
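The abstract does not reproduce the RMTL objective itself. As a point of reference only, regularized multi-task learning is commonly written as a joint empirical-risk minimization over the T tasks with a coupling regularizer; the symbols $f_t$, $n_t$, $\ell$, $\lambda$, and $\Omega$ below are generic placeholders and are not taken from the paper:
\[
\min_{f=(f_1,\dots,f_T)} \; \sum_{t=1}^{T} \frac{1}{n_t} \sum_{i=1}^{n_t} \ell\bigl(f_t(x_i^{(t)}),\, y_i^{(t)}\bigr) \;+\; \lambda\,\Omega(f_1,\dots,f_T),
\]
where $\Omega$ couples the tasks. This generic form is only meant to fix notation; it is not the exact formulation analyzed in the paper, whose ODDM and EDDM quantities measure dependence between two groups of tasks rather than being part of the regularizer.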
Related papers
Exploiting Task Relatedness for Multiple Task Learning
The approach of learning multiple "related" tasks simultaneously has proven quite successful in practice; however, theoretical justification for this success has remained elusive. The starting point of previous work on multiple task learning has been that the tasks to be learnt jointly are somehow "algorithmically related", in the sense that the results of applying a specific learning algori...
Hierarchical Multi-Task Learning: A Cascade Approach Based on the Notion of Task Relatedness
Multi-task learning can be shown to improve the generalization performance of single tasks under certain conditions. Typically, the algorithmic and theoretical analysis of multi-task learning deals with a two-level structure, including a group of tasks and a single task. In many situations, however, it is beneficial to consider varying degrees of relatedness among tasks, assuming that some task...
Multi-Task Learning and Algorithmic Stability
In this paper, we study multi-task algorithms from the perspective of algorithmic stability. We give a definition of multi-task uniform stability, a generalization of conventional uniform stability, which measures the maximum difference between the loss of a multi-task algorithm trained on a data set and that of the multi-task algorithm trained on the same data set but with a data po...
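For readers unfamiliar with the notion, conventional (single-task) uniform stability bounds, uniformly over training sets and test points, the change in loss caused by removing one training point. A minimal sketch of how such a definition extends to T tasks is given below; the notation $A_t$, $S$, $S^{\setminus i}$, $z$, and $\beta$ is generic and not necessarily that of the cited paper:
\[
\max_{1 \le t \le T} \;\; \sup_{S,\, i,\, z} \; \bigl|\, \ell\bigl(A_t(S), z\bigr) - \ell\bigl(A_t(S^{\setminus i}), z\bigr) \,\bigr| \;\le\; \beta,
\]
where $A_t(S)$ denotes the hypothesis returned for task $t$ when the algorithm is trained on the full multi-task sample $S$, and $S^{\setminus i}$ is $S$ with one training point removed.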
The Rademacher Complexity of Linear Transformation Classes
Bounds are given for the empirical and expected Rademacher complexity of classes of linear transformations from a Hilbert space H to a finite-dimensional space. The results imply generalization guarantees for graph regularization and multi-task subspace learning. Rademacher averages have been introduced to learning theory as an efficient complexity measure for function classes, mot...
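For context, the empirical Rademacher complexity of a scalar-valued function class $\mathcal{F}$ on a sample $x_1,\dots,x_n$ is the standard quantity
\[
\hat{\mathcal{R}}_n(\mathcal{F}) \;=\; \mathbb{E}_{\sigma}\Bigl[\, \sup_{f \in \mathcal{F}} \frac{1}{n} \sum_{i=1}^{n} \sigma_i\, f(x_i) \Bigr],
\]
where the $\sigma_i$ are independent uniform $\pm 1$ (Rademacher) variables; the cited work bounds the analogous quantity for classes of linear transformations from a Hilbert space to a finite-dimensional space.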
Journal: CoRR
Volume: abs/1408.6617
Pages: -
Publication date: 2014